In this project we will try to predict the weekly closing price of Corn commodity futures. To do so, we will build a dataset that combines weekly Corn futures closing prices with the Long and Short Open Interest of Producers/Merchants/Processors/Users (sometimes called Commercials) from the CFTC Commitment of Traders (COT) reports, and then use that dataset to predict next week's price.
Historical Futures Prices: Corn Futures, Continuous Contract #1. Non-adjusted price based on spot-month continuous contract calculations. Raw data from CME:
Can be found here
Commitment of Traders - CORN (CBT) - Futures Only (002602)
Can be found here
The data has been downloaded and stored in the `data` folder:
import warnings
warnings.filterwarnings('ignore')
import pandas as pd
import numpy as np
from IPython.core.display import display, HTML
pd.options.display.max_colwidth = 500 # You need this, otherwise pandas
# will limit your HTML strings to 50 characters
pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)
pd.options.mode.chained_assignment = None # default='warn'
from matplotlib import pyplot
from sklearn.preprocessing import MinMaxScaler
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from math import sqrt
from numpy import concatenate
from sklearn.metrics import mean_squared_error
import matplotlib.pyplot as plt
from plotly.offline import download_plotlyjs, init_notebook_mode, plot, iplot
import cufflinks as cf
import plotly.tools as tls
init_notebook_mode(connected=True)
cf.go_offline()
Using TensorFlow backend.
df_fut_orig = pd.read_csv('data/CHRIS-CME_C1.csv')
df_fut_orig.head(n=5)
|   | Date | Open | High | Low | Last | Change | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 2018-07-10 | 344.25 | 344.75 | 336.25 | 339.50 | 6.00 | 339.75 | 2668.0 | 2186.0 |
| 1 | 2018-07-09 | 346.00 | 348.50 | 342.50 | 346.00 | 6.00 | 345.75 | 3190.0 | 2969.0 |
| 2 | 2018-07-06 | 342.00 | 352.25 | 342.00 | 350.75 | 8.25 | 351.75 | 3068.0 | 3959.0 |
| 3 | 2018-07-05 | 345.50 | 348.75 | 341.50 | 342.50 | 0.75 | 343.50 | 3302.0 | 4812.0 |
| 4 | 2018-07-03 | 340.25 | 345.25 | 339.25 | 343.25 | 5.25 | 342.75 | 3048.0 | 5687.0 |
# Display a description of the dataset
display(df_fut_orig.describe())
|   | Open | High | Low | Last | Change | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|---|---|---|---|---|
| count | 3033.000000 | 3034.000000 | 3034.000000 | 3034.000000 | 1081.000000 | 3034.000000 | 3034.000000 | 3034.00000 |
| mean | 457.095038 | 462.322924 | 451.795485 | 456.920040 | 3.950324 | 456.979318 | 103905.200396 | 352140.90145 |
| std | 140.338892 | 142.056030 | 138.436196 | 140.243019 | 3.415126 | 140.204571 | 73993.219920 | 248565.85531 |
| min | 219.000000 | 220.750000 | 216.750000 | 219.000000 | 0.000000 | 219.000000 | 0.000000 | 107.00000 |
| 25% | 360.000000 | 363.000000 | 356.250000 | 359.500000 | 1.500000 | 359.750000 | 40172.750000 | 107559.25000 |
| 50% | 388.500000 | 392.000000 | 383.500000 | 388.750000 | 3.000000 | 389.000000 | 102567.000000 | 365073.00000 |
| 75% | 565.500000 | 573.562500 | 557.375000 | 564.625000 | 5.500000 | 564.625000 | 152391.250000 | 556408.50000 |
| max | 830.250000 | 843.750000 | 822.750000 | 831.250000 | 30.750000 | 831.250000 | 538170.000000 | 858696.00000 |
df_fut_orig['Date'] = pd.to_datetime(df_fut_orig['Date'])
df_fut_orig.set_index('Date',inplace=True)
df_fut_orig = df_fut_orig.sort_values('Date')
Plot Corn Futures Price Series using Plotly
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_original_price_series(df_fut_orig)
It seems there are some rows where Volume = 0; let's find out more about these rows.
df_fut_orig[df_fut_orig['Volume']<1]
| Date | Open | High | Low | Last | Change | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|---|---|---|---|---|
| 2007-04-05 | 359.75 | 367.50 | 357.25 | 366.00 | NaN | 366.00 | 0.0 | 354349.0 |
| 2012-04-06 | 658.25 | 658.25 | 658.25 | 658.25 | NaN | 658.25 | 0.0 | 401521.0 |
| 2015-04-03 | 386.50 | 386.50 | 386.50 | 386.50 | NaN | 386.50 | 0.0 | 470964.0 |
Since we will resample daily prices into weekly prices, let's drop those rows.
# drop zero-volume rows
df_fut_orig.drop(df_fut_orig[df_fut_orig.Volume < 1].index, inplace=True)
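The same drop can be written as a boolean-mask filter; a minimal sketch with made-up rows:

```python
import pandas as pd

# Toy frame mirroring the futures data (values invented)
df = pd.DataFrame({
    "Settle": [366.00, 658.25, 386.50, 339.75],
    "Volume": [0.0, 0.0, 150.0, 2668.0],
})

# Keep only rows that actually traded; equivalent to dropping
# the Volume < 1 index as above
df = df[df["Volume"] >= 1].copy()
```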
df_cot_orig = pd.read_csv('data/CFTC-002602_F_ALL.csv')
display(df_cot_orig.head())
|   | Date | Open_Interest | Producer_Merchant_Processor_User_Longs | Producer_Merchant_Processor_User_Shorts | Swap Dealer Longs | Swap Dealer Shorts | Swap Dealer Spreads | Money Manager Longs | Money Manager Shorts | Money Manager Spreads | Other Reportable Longs | Other Reportable Shorts | Other Reportable Spreads | Total Reportable Longs | Total Reportable Shorts | Non Reportable Longs | Non Reportable Shorts |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2018-07-10 | 1818055.0 | 500172.0 | 750062.0 | 208128.0 | 39513.0 | 99477.0 | 263353.0 | 404297.0 | 154286.0 | 320946.0 | 70682.0 | 98709.0 | 1645071.0 | 1617026.0 | 172984.0 | 201029.0 |
| 1 | 2018-07-03 | 1830330.0 | 484257.0 | 773851.0 | 210341.0 | 36927.0 | 100340.0 | 274795.0 | 382191.0 | 149756.0 | 322256.0 | 66508.0 | 119627.0 | 1661372.0 | 1629200.0 | 168958.0 | 201130.0 |
| 2 | 2018-06-26 | 1885804.0 | 513100.0 | 840177.0 | 223131.0 | 32763.0 | 91972.0 | 287061.0 | 377825.0 | 153461.0 | 330396.0 | 58283.0 | 116745.0 | 1715866.0 | 1671226.0 | 169938.0 | 214578.0 |
| 3 | 2018-06-19 | 1992169.0 | 525197.0 | 920764.0 | 222105.0 | 41144.0 | 99285.0 | 299377.0 | 356828.0 | 163454.0 | 379025.0 | 56652.0 | 135078.0 | 1823521.0 | 1773205.0 | 168648.0 | 218964.0 |
| 4 | 2018-06-12 | 1963233.0 | 488666.0 | 917204.0 | 235249.0 | 37674.0 | 93281.0 | 292054.0 | 304292.0 | 172623.0 | 363918.0 | 65030.0 | 147098.0 | 1792889.0 | 1737202.0 | 170344.0 | 226031.0 |
display(df_cot_orig.describe())
|   | Open_Interest | Producer_Merchant_Processor_User_Longs | Producer_Merchant_Processor_User_Shorts | Swap Dealer Longs | Swap Dealer Shorts | Swap Dealer Spreads | Money Manager Longs | Money Manager Shorts | Money Manager Spreads | Other Reportable Longs | Other Reportable Shorts | Other Reportable Spreads | Total Reportable Longs | Total Reportable Shorts | Non Reportable Longs | Non Reportable Shorts |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 6.310000e+02 | 631.000000 | 6.310000e+02 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 631.000000 | 6.310000e+02 | 6.310000e+02 | 631.000000 | 631.000000 |
| mean | 1.292201e+06 | 270795.049128 | 6.268425e+05 | 290792.497623 | 20337.034865 | 33260.068146 | 236884.269414 | 137472.426307 | 94546.356577 | 140931.890650 | 70914.334390 | 85505.109350 | 1.152715e+06 | 1.068878e+06 | 139485.541997 | 223322.976228 |
| std | 2.095471e+05 | 68976.221600 | 1.554272e+05 | 53203.484072 | 18944.008732 | 22912.567257 | 67454.195123 | 109465.025186 | 32739.133163 | 51939.690903 | 26360.863384 | 29682.425476 | 1.939790e+05 | 2.060080e+05 | 23718.957966 | 29824.710288 |
| min | 7.482520e+05 | 102373.000000 | 2.972960e+05 | 186981.000000 | 0.000000 | 4397.000000 | 96989.000000 | 6714.000000 | 29130.000000 | 49809.000000 | 25905.000000 | 27592.000000 | 6.379810e+05 | 5.689510e+05 | 78578.000000 | 156086.000000 |
| 25% | 1.192226e+06 | 226595.000000 | 5.235930e+05 | 255196.500000 | 6524.000000 | 13978.000000 | 186366.500000 | 47947.000000 | 72018.500000 | 104764.000000 | 53331.000000 | 62690.000000 | 1.055362e+06 | 9.573815e+05 | 121829.500000 | 198860.500000 |
| 50% | 1.301506e+06 | 262823.000000 | 6.112810e+05 | 276337.000000 | 15239.000000 | 27209.000000 | 225682.000000 | 95548.000000 | 91850.000000 | 140343.000000 | 66261.000000 | 82705.000000 | 1.166372e+06 | 1.067548e+06 | 136966.000000 | 227337.000000 |
| 75% | 1.398275e+06 | 314224.000000 | 7.058555e+05 | 321265.500000 | 28178.000000 | 48009.500000 | 287331.000000 | 211154.000000 | 113803.000000 | 175846.000000 | 83448.500000 | 106077.500000 | 1.247976e+06 | 1.180280e+06 | 153542.500000 | 246903.000000 |
| max | 1.992169e+06 | 525197.000000 | 1.001517e+06 | 422803.000000 | 95591.000000 | 113775.000000 | 431569.000000 | 447470.000000 | 231064.000000 | 379025.000000 | 173322.000000 | 181385.000000 | 1.825238e+06 | 1.773205e+06 | 206821.000000 | 293948.000000 |
# keep only Settle, Volume and Previous_Day_Open_Interest (positions 5-7)
df_fut = df_fut_orig.drop(columns=[clmn for i, clmn in enumerate(df_fut_orig.columns) if i not in [5, 6, 7]])
display(df_fut.head())
| Date | Settle | Volume | Previous_Day_Open_Interest |
|---|---|---|---|
| 2006-06-16 | 235.50 | 56486.0 | 203491.0 |
| 2006-06-19 | 229.75 | 51299.0 | 190044.0 |
| 2006-06-20 | 229.75 | 41605.0 | 175859.0 |
| 2006-06-21 | 232.75 | 29803.0 | 162348.0 |
| 2006-06-22 | 230.50 | 28687.0 | 147658.0 |
s_settle = df_fut['Settle'].resample('W').last()
s_volume = df_fut['Volume'].resample('W').last()
df_fut_weekly = pd.concat([s_settle, s_volume], axis=1)
display(df_fut_weekly.head())
| Date | Settle | Volume |
|---|---|---|
| 2006-06-18 | 235.50 | 56486.0 |
| 2006-06-25 | 228.25 | 28361.0 |
| 2006-07-02 | 235.50 | 30519.0 |
| 2006-07-09 | 241.00 | 13057.0 |
| 2006-07-16 | 253.50 | 2460.0 |
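To see what `resample('W').last()` does, here is a sketch using the five daily settles from the head of `df_fut` above:

```python
import pandas as pd

# Business days starting Friday 2006-06-16, settles from the head above
idx = pd.date_range("2006-06-16", periods=5, freq="B")
daily = pd.Series([235.50, 229.75, 229.75, 232.75, 230.50], index=idx)

# 'W' bins end on Sunday and are labelled by that Sunday; .last()
# picks the final observation in each bin, i.e. the settle of the
# week's last trading day
weekly = daily.resample("W").last()
```

The first bin holds only the Friday 2006-06-16 settle, so the week labelled 2006-06-18 carries 235.50, matching the weekly table above.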
# keep only Date, Open_Interest and the Producer/Merchant/Processor/User positions (positions 0-3)
df_cot = df_cot_orig.drop(columns=[clmn for i, clmn in enumerate(df_cot_orig.columns) if i not in [0, 1, 2, 3]])
df_cot.rename(columns={"Producer_Merchant_Processor_User_Longs": "Longs",
                       "Producer_Merchant_Processor_User_Shorts": "Shorts"}, inplace=True)
df_cot['Date'] = pd.to_datetime(df_cot['Date'])
df_cot.set_index('Date',inplace=True)
display(df_cot.head())
| Date | Open_Interest | Longs | Shorts |
|---|---|---|---|
| 2018-07-10 | 1818055.0 | 500172.0 | 750062.0 |
| 2018-07-03 | 1830330.0 | 484257.0 | 773851.0 |
| 2018-06-26 | 1885804.0 | 513100.0 | 840177.0 |
| 2018-06-19 | 1992169.0 | 525197.0 | 920764.0 |
| 2018-06-12 | 1963233.0 | 488666.0 | 917204.0 |
s_longs = df_cot['Longs'].resample('W').last()
s_shorts = df_cot['Shorts'].resample('W').last()
s_open_interest = df_cot['Open_Interest'].resample('W').last()
df_cot_weekly = pd.concat([s_open_interest, s_longs, s_shorts], axis=1)
display(df_cot_weekly.head(5))
| Date | Open_Interest | Longs | Shorts |
|---|---|---|---|
| 2006-06-18 | 1320155.0 | 209662.0 | 699163.0 |
| 2006-06-25 | 1321520.0 | 224476.0 | 666688.0 |
| 2006-07-02 | 1329400.0 | 234769.0 | 645735.0 |
| 2006-07-09 | 1327482.0 | 220552.0 | 648405.0 |
| 2006-07-16 | 1333225.0 | 216968.0 | 673110.0 |
df_weekly = pd.merge(df_fut_weekly,df_cot_weekly, on='Date')
display(df_weekly.head(5))
| Date | Settle | Volume | Open_Interest | Longs | Shorts |
|---|---|---|---|---|---|
| 2006-06-18 | 235.50 | 56486.0 | 1320155.0 | 209662.0 | 699163.0 |
| 2006-06-25 | 228.25 | 28361.0 | 1321520.0 | 224476.0 | 666688.0 |
| 2006-07-02 | 235.50 | 30519.0 | 1329400.0 | 234769.0 | 645735.0 |
| 2006-07-09 | 241.00 | 13057.0 | 1327482.0 | 220552.0 | 648405.0 |
| 2006-07-16 | 253.50 | 2460.0 | 1333225.0 | 216968.0 | 673110.0 |
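`pd.merge(..., on='Date')` joins on the shared `Date` index level as an inner join, so only weeks present in both frames survive. A small sketch with invented values:

```python
import pandas as pd

dates = pd.to_datetime(["2006-06-18", "2006-06-25", "2006-07-02"])
fut = pd.DataFrame({"Settle": [235.50, 228.25, 235.50]},
                   index=pd.Index(dates, name="Date"))
# COT frame missing the last week, to show the inner-join behaviour
cot = pd.DataFrame({"Longs": [209662.0, 224476.0]},
                   index=pd.Index(dates[:2], name="Date"))

merged = pd.merge(fut, cot, on="Date")  # only the two common weeks remain
```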
# Display a description of the dataset
display(df_weekly.describe())
|   | Settle | Volume | Open_Interest | Longs | Shorts |
|---|---|---|---|---|---|
| count | 631.000000 | 631.000000 | 6.310000e+02 | 631.000000 | 6.310000e+02 |
| mean | 456.978605 | 100835.204437 | 1.292201e+06 | 270795.049128 | 6.268425e+05 |
| std | 140.242112 | 72466.341538 | 2.095471e+05 | 68976.221600 | 1.554272e+05 |
| min | 219.750000 | 132.000000 | 7.482520e+05 | 102373.000000 | 2.972960e+05 |
| 25% | 359.500000 | 34822.500000 | 1.192226e+06 | 226595.000000 | 5.235930e+05 |
| 50% | 389.250000 | 101209.000000 | 1.301506e+06 | 262823.000000 | 6.112810e+05 |
| 75% | 560.375000 | 150341.000000 | 1.398275e+06 | 314224.000000 | 7.058555e+05 |
| max | 824.500000 | 369522.000000 | 1.992169e+06 | 525197.000000 | 1.001517e+06 |
# reset index since we need row numbers for splitting
df_weekly_idx_date = df_weekly.copy()
df_weekly.reset_index(inplace=True)
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_weekly_combined_series_by_date(df_weekly)
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_weekly_combined_series_by_trading_week(df_weekly)
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_grouped_by_year_data(df_weekly_idx_date,"Stacked Plots of Price by Year")
%load_ext autoreload
%autoreload 2
import visuals
visuals.lag_plot(df_weekly,"Lag Plot")
scaler = MinMaxScaler(feature_range=(0, 1))
values = df_weekly.loc[:, df_weekly.columns != 'Date'].values
scaled = scaler.fit_transform(values)
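The scaler maps each column independently to [0, 1]; keeping the fitted scaler around matters because `inverse_transform` is what maps predictions back to price units later. A sketch with invented values:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Two columns on very different scales, as Settle and Volume are
values = np.array([[235.5, 56486.0],
                   [824.5, 369522.0],
                   [530.0, 200000.0]])

scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(values)        # column-wise (x - min) / (max - min)
restored = scaler.inverse_transform(scaled)  # back to original units
```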
validation_start = df_weekly[df_weekly['Date'] >= pd.to_datetime('2017-01-01')].index[0]
testing_start = df_weekly[df_weekly['Date'] >= pd.to_datetime('2018-01-01')].index[0]
print("validation start", validation_start)
print("testing start", testing_start)
validation start 550
testing start 603
# print data to double check
#print(df_weekly.iloc[validation_start])
#print(df_weekly.iloc[testing_start])
%load_ext autoreload
%autoreload 2
import data_preparer
reframed = data_preparer.series_to_supervised(scaled, 1, 1)
# drop columns we don't want to predict
reframed.drop(reframed.columns[[6,7,8,9]], axis=1, inplace=True)
display(reframed.head())
|   | var1(t-1) | var2(t-1) | var3(t-1) | var4(t-1) | var5(t-1) | var1(t) |
|---|---|---|---|---|---|---|
| 1 | 0.026044 | 0.152560 | 0.459760 | 0.253744 | 0.570655 | 0.014055 |
| 2 | 0.014055 | 0.076421 | 0.460857 | 0.288780 | 0.524540 | 0.026044 |
| 3 | 0.026044 | 0.082263 | 0.467192 | 0.313123 | 0.494786 | 0.035138 |
| 4 | 0.035138 | 0.034990 | 0.465650 | 0.279499 | 0.498578 | 0.055808 |
| 5 | 0.055808 | 0.006302 | 0.470267 | 0.271023 | 0.533659 | 0.028938 |
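`data_preparer.series_to_supervised` is not shown here; it presumably follows the common shift-based framing, something like this sketch (a guess at what the module provides, not its actual source):

```python
import pandas as pd

def series_to_supervised(data, n_in=1, n_out=1):
    """Frame a (scaled) multivariate series as supervised learning:
    lag columns var_i(t-n_in)..var_i(t-1) plus var_i(t)..var_i(t+n_out-1)."""
    df = pd.DataFrame(data)
    n_vars = df.shape[1]
    cols, names = [], []
    for i in range(n_in, 0, -1):        # input lags
        cols.append(df.shift(i))
        names += [f"var{j+1}(t-{i})" for j in range(n_vars)]
    for i in range(n_out):              # forecast steps
        cols.append(df.shift(-i))
        names += [f"var{j+1}(t)" if i == 0 else f"var{j+1}(t+{i})"
                  for j in range(n_vars)]
    agg = pd.concat(cols, axis=1)
    agg.columns = names
    return agg.dropna()                 # drop rows without a full lag window

frame = series_to_supervised([[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]])
```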
%load_ext autoreload
%autoreload 2
import data_preparer
train_X, train_y, validation_X, validation_y, test_X, test_y = data_preparer.split_data(reframed, validation_start, testing_start)
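`data_preparer.split_data` is also external; given the row indices computed above, it presumably slices by position, separates the last column as the target, and reshapes X to the `(samples, timesteps, features)` layout Keras LSTMs expect. A sketch under those assumptions:

```python
import numpy as np

def split_data(reframed, validation_start, testing_start):
    values = np.asarray(reframed, dtype=float)
    parts = (values[:validation_start],                    # training rows
             values[validation_start:testing_start],      # validation rows
             values[testing_start:])                      # test rows

    def xy(block):
        X, y = block[:, :-1], block[:, -1]                # last column is var1(t)
        return X.reshape(X.shape[0], 1, X.shape[1]), y    # 1 timestep per sample

    (train_X, train_y), (val_X, val_y), (test_X, test_y) = map(xy, parts)
    return train_X, train_y, val_X, val_y, test_X, test_y

# 10 rows, 5 features + 1 target; split at rows 6 and 8
demo = np.arange(60, dtype=float).reshape(10, 6)
train_X, train_y, val_X, val_y, test_X, test_y = split_data(demo, 6, 8)
```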
%load_ext autoreload
%autoreload 2
import models
model, history = models.basic_lstm_model(train_X, train_y, validation_X, validation_y)
Train on 550 samples, validate on 53 samples
Epoch 1/500 - 14s - loss: 0.5592 - val_loss: 0.3919
Epoch 2/500 - 0s - loss: 0.5173 - val_loss: 0.3437
...
Epoch 375/500 - 0s - loss: 0.0279 - val_loss: 0.0141
Epoch 376/500 - 0s - loss: 0.0279 - val_loss: 0.0140
(per-epoch log abridged: training loss falls steadily from 0.56 to about 0.028, while validation loss bottoms out around 0.014)
377/500 - 0s - loss: 0.0279 - val_loss: 0.0142 Epoch 378/500 - 0s - loss: 0.0279 - val_loss: 0.0143 Epoch 379/500 - 0s - loss: 0.0278 - val_loss: 0.0139 Epoch 380/500 - 0s - loss: 0.0279 - val_loss: 0.0143 Epoch 381/500 - 0s - loss: 0.0278 - val_loss: 0.0141 Epoch 382/500 - 0s - loss: 0.0278 - val_loss: 0.0139 Epoch 383/500 - 0s - loss: 0.0278 - val_loss: 0.0142 Epoch 384/500 - 0s - loss: 0.0278 - val_loss: 0.0141 Epoch 385/500 - 0s - loss: 0.0277 - val_loss: 0.0140 Epoch 386/500 - 0s - loss: 0.0277 - val_loss: 0.0139 Epoch 387/500 - 0s - loss: 0.0277 - val_loss: 0.0143 Epoch 388/500 - 0s - loss: 0.0277 - val_loss: 0.0142 Epoch 389/500 - 0s - loss: 0.0277 - val_loss: 0.0139 Epoch 390/500 - 0s - loss: 0.0277 - val_loss: 0.0139 Epoch 391/500 - 0s - loss: 0.0277 - val_loss: 0.0141 Epoch 392/500 - 0s - loss: 0.0277 - val_loss: 0.0142 Epoch 393/500 - 0s - loss: 0.0276 - val_loss: 0.0139 Epoch 394/500 - 0s - loss: 0.0276 - val_loss: 0.0140 Epoch 395/500 - 0s - loss: 0.0276 - val_loss: 0.0142 Epoch 396/500 - 0s - loss: 0.0276 - val_loss: 0.0143 Epoch 397/500 - 0s - loss: 0.0276 - val_loss: 0.0140 Epoch 398/500 - 0s - loss: 0.0276 - val_loss: 0.0140 Epoch 399/500 - 0s - loss: 0.0276 - val_loss: 0.0139 Epoch 400/500 - 0s - loss: 0.0275 - val_loss: 0.0140 Epoch 401/500 - 0s - loss: 0.0275 - val_loss: 0.0140 Epoch 402/500 - 0s - loss: 0.0275 - val_loss: 0.0139 Epoch 403/500 - 0s - loss: 0.0275 - val_loss: 0.0140 Epoch 404/500 - 0s - loss: 0.0275 - val_loss: 0.0138 Epoch 405/500 - 0s - loss: 0.0275 - val_loss: 0.0141 Epoch 406/500 - 0s - loss: 0.0275 - val_loss: 0.0141 Epoch 407/500 - 0s - loss: 0.0274 - val_loss: 0.0139 Epoch 408/500 - 0s - loss: 0.0274 - val_loss: 0.0137 Epoch 409/500 - 0s - loss: 0.0275 - val_loss: 0.0144 Epoch 410/500 - 0s - loss: 0.0274 - val_loss: 0.0139 Epoch 411/500 - 0s - loss: 0.0274 - val_loss: 0.0139 Epoch 412/500 - 0s - loss: 0.0274 - val_loss: 0.0138 Epoch 413/500 - 0s - loss: 0.0274 - val_loss: 0.0140 Epoch 414/500 - 0s - loss: 0.0274 - 
val_loss: 0.0138 Epoch 415/500 - 0s - loss: 0.0274 - val_loss: 0.0138 Epoch 416/500 - 0s - loss: 0.0274 - val_loss: 0.0138 Epoch 417/500 - 0s - loss: 0.0274 - val_loss: 0.0140 Epoch 418/500 - 0s - loss: 0.0273 - val_loss: 0.0138 Epoch 419/500 - 0s - loss: 0.0273 - val_loss: 0.0138 Epoch 420/500 - 0s - loss: 0.0273 - val_loss: 0.0137 Epoch 421/500 - 0s - loss: 0.0274 - val_loss: 0.0139 Epoch 422/500 - 0s - loss: 0.0273 - val_loss: 0.0137 Epoch 423/500 - 0s - loss: 0.0273 - val_loss: 0.0137 Epoch 424/500 - 0s - loss: 0.0273 - val_loss: 0.0137 Epoch 425/500 - 0s - loss: 0.0273 - val_loss: 0.0138 Epoch 426/500 - 0s - loss: 0.0273 - val_loss: 0.0137 Epoch 427/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 428/500 - 0s - loss: 0.0273 - val_loss: 0.0139 Epoch 429/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 430/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 431/500 - 0s - loss: 0.0272 - val_loss: 0.0136 Epoch 432/500 - 0s - loss: 0.0272 - val_loss: 0.0136 Epoch 433/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 434/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 435/500 - 0s - loss: 0.0272 - val_loss: 0.0135 Epoch 436/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 437/500 - 0s - loss: 0.0272 - val_loss: 0.0136 Epoch 438/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 439/500 - 0s - loss: 0.0272 - val_loss: 0.0137 Epoch 440/500 - 0s - loss: 0.0271 - val_loss: 0.0137 Epoch 441/500 - 0s - loss: 0.0271 - val_loss: 0.0136 Epoch 442/500 - 0s - loss: 0.0271 - val_loss: 0.0137 Epoch 443/500 - 0s - loss: 0.0271 - val_loss: 0.0136 Epoch 444/500 - 0s - loss: 0.0271 - val_loss: 0.0137 Epoch 445/500 - 0s - loss: 0.0271 - val_loss: 0.0137 Epoch 446/500 - 0s - loss: 0.0271 - val_loss: 0.0135 Epoch 447/500 - 0s - loss: 0.0271 - val_loss: 0.0136 Epoch 448/500 - 0s - loss: 0.0271 - val_loss: 0.0137 Epoch 449/500 - 0s - loss: 0.0271 - val_loss: 0.0135 Epoch 450/500 - 0s - loss: 0.0271 - val_loss: 0.0136 Epoch 451/500 - 0s - loss: 0.0271 - val_loss: 0.0135 Epoch 452/500 - 0s - 
loss: 0.0271 - val_loss: 0.0134 Epoch 453/500 - 0s - loss: 0.0271 - val_loss: 0.0134 Epoch 454/500 - 0s - loss: 0.0270 - val_loss: 0.0134 Epoch 455/500 - 0s - loss: 0.0270 - val_loss: 0.0134 Epoch 456/500 - 0s - loss: 0.0271 - val_loss: 0.0136 Epoch 457/500 - 0s - loss: 0.0270 - val_loss: 0.0134 Epoch 458/500 - 0s - loss: 0.0270 - val_loss: 0.0133 Epoch 459/500 - 0s - loss: 0.0270 - val_loss: 0.0135 Epoch 460/500 - 0s - loss: 0.0270 - val_loss: 0.0134 Epoch 461/500 - 0s - loss: 0.0270 - val_loss: 0.0133 Epoch 462/500 - 0s - loss: 0.0270 - val_loss: 0.0133 Epoch 463/500 - 0s - loss: 0.0270 - val_loss: 0.0133 Epoch 464/500 - 0s - loss: 0.0270 - val_loss: 0.0135 Epoch 465/500 - 0s - loss: 0.0270 - val_loss: 0.0134 Epoch 466/500 - 0s - loss: 0.0269 - val_loss: 0.0133 Epoch 467/500 - 0s - loss: 0.0270 - val_loss: 0.0132 Epoch 468/500 - 0s - loss: 0.0270 - val_loss: 0.0134 Epoch 469/500 - 0s - loss: 0.0269 - val_loss: 0.0132 Epoch 470/500 - 0s - loss: 0.0269 - val_loss: 0.0133 Epoch 471/500 - 0s - loss: 0.0269 - val_loss: 0.0132 Epoch 472/500 - 0s - loss: 0.0269 - val_loss: 0.0133 Epoch 473/500 - 0s - loss: 0.0269 - val_loss: 0.0132 Epoch 474/500 - 0s - loss: 0.0269 - val_loss: 0.0132 Epoch 475/500 - 0s - loss: 0.0269 - val_loss: 0.0132 Epoch 476/500 - 0s - loss: 0.0269 - val_loss: 0.0131 Epoch 477/500 - 0s - loss: 0.0269 - val_loss: 0.0130 Epoch 478/500 - 0s - loss: 0.0269 - val_loss: 0.0131 Epoch 479/500 - 0s - loss: 0.0269 - val_loss: 0.0131 Epoch 480/500 - 0s - loss: 0.0269 - val_loss: 0.0131 Epoch 481/500 - 0s - loss: 0.0268 - val_loss: 0.0132 Epoch 482/500 - 0s - loss: 0.0268 - val_loss: 0.0131 Epoch 483/500 - 0s - loss: 0.0268 - val_loss: 0.0131 Epoch 484/500 - 0s - loss: 0.0268 - val_loss: 0.0130 Epoch 485/500 - 0s - loss: 0.0268 - val_loss: 0.0131 Epoch 486/500 - 0s - loss: 0.0268 - val_loss: 0.0131 Epoch 487/500 - 0s - loss: 0.0268 - val_loss: 0.0129 Epoch 488/500 - 0s - loss: 0.0268 - val_loss: 0.0131 Epoch 489/500 - 0s - loss: 0.0268 - val_loss: 0.0129 Epoch 
490/500 - 0s - loss: 0.0268 - val_loss: 0.0129 Epoch 491/500 - 0s - loss: 0.0268 - val_loss: 0.0130 Epoch 492/500 - 0s - loss: 0.0268 - val_loss: 0.0130 Epoch 493/500 - 0s - loss: 0.0267 - val_loss: 0.0129 Epoch 494/500 - 0s - loss: 0.0267 - val_loss: 0.0129 Epoch 495/500 - 0s - loss: 0.0268 - val_loss: 0.0129 Epoch 496/500 - 0s - loss: 0.0268 - val_loss: 0.0129 Epoch 497/500 - 0s - loss: 0.0267 - val_loss: 0.0129 Epoch 498/500 - 0s - loss: 0.0267 - val_loss: 0.0129 Epoch 499/500 - 0s - loss: 0.0267 - val_loss: 0.0128 Epoch 500/500 - 0s - loss: 0.0267 - val_loss: 0.0130
pyplot.plot(history['loss'], label='train')
pyplot.plot(history['val_loss'], label='validation')
pyplot.legend()
pyplot.show()
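The train/validation curves above flatten out toward epoch 500. One quick way to read the best epoch straight off a recorded loss trace is to take the argmin of the validation losses. This is a toy sketch on made-up numbers, not the actual `history` object from training:

```python
import numpy as np

# Toy validation-loss trace (made-up values for illustration only)
val_loss = np.array([0.30, 0.20, 0.0153, 0.0152, 0.0151, 0.0150, 0.0151])

best_epoch = int(np.argmin(val_loss)) + 1  # epochs are 1-based in the Keras log
print(best_epoch)  # → 6
```

In practice the same idea is what a Keras `EarlyStopping` callback with `restore_best_weights=True` automates during training.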
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse=models.make_lstm_prediction(validation_X,validation_y,model,scaler)
print('LSTM Model on Validation Data RMSE: %.3f' % rmse)
LSTM Model on Validation Data RMSE: 9.479
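`make_lstm_prediction` in models.py inverse-transforms the scaled predictions before scoring; the score itself is plain RMSE. A minimal stand-alone version of that metric (the `rmse` helper name is mine, not from models.py):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-squared error between actual and predicted prices."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Predictions off by a constant 3.0 give an RMSE of exactly 3.0
print(rmse([100.0, 105.0, 110.0], [103.0, 108.0, 113.0]))  # → 3.0
```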
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_series_to_compare(inv_y,inv_yhat,"Actual Price","Predicted Price", "Actual Price Versus LSTM Predicted Price")
In this section we will check our benchmark model. As proposed in the project proposal, the benchmark model is a simple linear regressor.
from pandas import read_csv
from pandas import DataFrame
from pandas import concat
from datetime import datetime  # pandas.datetime is deprecated; import from the standard library
from matplotlib import pyplot
from sklearn.metrics import mean_squared_error
from math import sqrt
# Create lagged dataset
values = pd.DataFrame(df_weekly["Settle"].values)
df_benchmark = concat([values.shift(1), values], axis=1)
df_benchmark.columns = ['t', 't+1']
display(df_benchmark.head(5))
|   | t      | t+1    |
|---|--------|--------|
| 0 | NaN    | 235.50 |
| 1 | 235.50 | 228.25 |
| 2 | 228.25 | 235.50 |
| 3 | 235.50 | 241.00 |
| 4 | 241.00 | 253.50 |
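The same one-step lag construction on a toy series, to make the `shift(1)` explicit (toy values, not the real Settle column):

```python
import pandas as pd

settle = pd.Series([235.50, 228.25, 235.50, 241.00, 253.50])
lagged = pd.concat([settle.shift(1), settle], axis=1)
lagged.columns = ['t', 't+1']

# Row 0 has no prior week, hence the NaN in column 't'
print(lagged)
```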
# split into train , validation and test sets
X = df_benchmark.values
train, validation, test = X[1:validation_start], X[validation_start:testing_start],X[testing_start:]
train_bench_X, train_bench_y = train[:,0], train[:,1]
validation_bench_X, validation_bench_y = validation[:,0], validation[:,1]
test_bench_X, test_bench_y = test[:,0], test[:,1]
%load_ext autoreload
%autoreload 2
import models
# make a prediction
%load_ext autoreload
%autoreload 2
import models
predictions,rmse=models.make_benchmark_model_prediction(validation_bench_X,validation_bench_y)
print('Benchmark Model on Validation Data RMSE: %.3f' % rmse)
Benchmark Model on Validation Data RMSE: 8.750
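`models.make_benchmark_model_prediction` is not shown here. One plausible sketch, assuming it fits scikit-learn's `LinearRegression` on the single lagged feature (the `benchmark_rmse` name and the fit-and-score-on-one-split shortcut are mine, not the project's actual code):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def benchmark_rmse(X, y):
    """Fit y[t+1] ~ y[t] with a one-feature linear regression and score it."""
    X = np.asarray(X, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float)
    preds = LinearRegression().fit(X, y).predict(X)
    return preds, float(np.sqrt(np.mean((y - preds) ** 2)))

# Perfectly linear data is fit exactly, so the RMSE is ~0
preds, score = benchmark_rmse([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```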
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_series_to_compare(validation_bench_y,predictions,"Actual Price","Predicted Price", "Actual Price Versus Benchmark Model Predicted Price")
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse=models.make_lstm_prediction(test_X,test_y,model,scaler)
print('LSTM Model on Test Data RMSE: %.3f' % rmse)
LSTM Model on Test Data RMSE: 12.079
# make a prediction
%load_ext autoreload
%autoreload 2
import models
predictions,rmse=models.make_benchmark_model_prediction(test_bench_X,test_bench_y)
print('Benchmark Model on Test Data RMSE: %.3f' % rmse)
Benchmark Model on Test Data RMSE: 8.293
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_memmory_cells(train_X,train_y,validation_X,validation_y)
>1/5 param=1.000000, loss=0.012129
>2/5 param=1.000000, loss=0.014510
>3/5 param=1.000000, loss=0.011251
>4/5 param=1.000000, loss=0.013165
>5/5 param=1.000000, loss=0.012103
>1/5 param=5.000000, loss=0.011613
>2/5 param=5.000000, loss=0.012066
>3/5 param=5.000000, loss=0.011987
>4/5 param=5.000000, loss=0.012024
>5/5 param=5.000000, loss=0.012577
>1/5 param=10.000000, loss=0.012330
>2/5 param=10.000000, loss=0.013115
>3/5 param=10.000000, loss=0.013052
>4/5 param=10.000000, loss=0.011792
>5/5 param=10.000000, loss=0.013219
>1/5 param=25.000000, loss=0.011451
>2/5 param=25.000000, loss=0.013046
>3/5 param=25.000000, loss=0.011217
>4/5 param=25.000000, loss=0.011381
>5/5 param=25.000000, loss=0.011058
>1/5 param=50.000000, loss=0.012644
>2/5 param=50.000000, loss=0.012646
>3/5 param=50.000000, loss=0.011140
>4/5 param=50.000000, loss=0.013345
>5/5 param=50.000000, loss=0.012515
>1/5 param=100.000000, loss=0.011604
>2/5 param=100.000000, loss=0.015141
>3/5 param=100.000000, loss=0.012493
>4/5 param=100.000000, loss=0.012222
>5/5 param=100.000000, loss=0.013316
>1/5 param=200.000000, loss=0.011432
>2/5 param=200.000000, loss=0.012943
>3/5 param=200.000000, loss=0.010766
>4/5 param=200.000000, loss=0.014985
>5/5 param=200.000000, loss=0.011893
1 5 10 25 50 100 200
count 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000
mean 0.012632 0.012053 0.012702 0.011631 0.012458 0.012955 0.012404
std 0.001250 0.000344 0.000618 0.000806 0.000805 0.001368 0.001646
min 0.011251 0.011613 0.011792 0.011058 0.011140 0.011604 0.010766
25% 0.012103 0.011987 0.012330 0.011217 0.012515 0.012222 0.011432
50% 0.012129 0.012024 0.013052 0.011381 0.012644 0.012493 0.011893
75% 0.013165 0.012066 0.013115 0.011451 0.012646 0.013316 0.012943
max 0.014510 0.012577 0.013219 0.013046 0.013345 0.015141 0.014985
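`tune_model.tune_memmory_cells` repeats each configuration several times and summarizes the validation losses, which is how the count/mean/std/quantile table above is produced. A cheap stand-in harness for that pattern (the `evaluate` stub below fakes a training run with noise instead of actually fitting the LSTM):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def evaluate(param):
    """Stub for one train/validate run; in tune_model this would fit the
    LSTM with `param` memory cells and return its validation loss."""
    return 0.012 + 0.001 * rng.standard_normal()

n_repeats = 5
params = [1, 5, 10, 25, 50, 100, 200]
results = pd.DataFrame({p: [evaluate(p) for _ in range(n_repeats)]
                        for p in params})

# describe() yields the same count/mean/std/quantile summary shown above
print(results.describe())
```

Averaging over repeats matters here because a single LSTM run's loss varies with weight initialization, as the spread within each column above shows.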
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_batch_size(train_X,train_y,validation_X,validation_y)
>1/5 param=2.000000, loss=0.017062
>2/5 param=2.000000, loss=0.017288
>3/5 param=2.000000, loss=0.019628
>4/5 param=2.000000, loss=0.017816
>5/5 param=2.000000, loss=0.019510
>1/5 param=4.000000, loss=0.012946
>2/5 param=4.000000, loss=0.013468
>3/5 param=4.000000, loss=0.012065
>4/5 param=4.000000, loss=0.012588
>5/5 param=4.000000, loss=0.013342
>1/5 param=8.000000, loss=0.014913
>2/5 param=8.000000, loss=0.016104
>3/5 param=8.000000, loss=0.015724
>4/5 param=8.000000, loss=0.015701
>5/5 param=8.000000, loss=0.014112
>1/5 param=32.000000, loss=0.011369
>2/5 param=32.000000, loss=0.011775
>3/5 param=32.000000, loss=0.012824
>4/5 param=32.000000, loss=0.012704
>5/5 param=32.000000, loss=0.011133
>1/5 param=64.000000, loss=0.011609
>2/5 param=64.000000, loss=0.011532
>3/5 param=64.000000, loss=0.013435
>4/5 param=64.000000, loss=0.011951
>5/5 param=64.000000, loss=0.012349
>1/5 param=128.000000, loss=0.011928
>2/5 param=128.000000, loss=0.012988
>3/5 param=128.000000, loss=0.011940
>4/5 param=128.000000, loss=0.011974
>5/5 param=128.000000, loss=0.011488
>1/5 param=256.000000, loss=0.011780
>2/5 param=256.000000, loss=0.013215
>3/5 param=256.000000, loss=0.012355
>4/5 param=256.000000, loss=0.011390
>5/5 param=256.000000, loss=0.011595
2 4 8 32 64 128 256
count 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000 5.000000
mean 0.018261 0.012882 0.015311 0.011961 0.012175 0.012064 0.012067
std 0.001226 0.000573 0.000798 0.000769 0.000775 0.000554 0.000736
min 0.017062 0.012065 0.014112 0.011133 0.011532 0.011488 0.011390
25% 0.017288 0.012588 0.014913 0.011369 0.011609 0.011928 0.011595
50% 0.017816 0.012946 0.015701 0.011775 0.011951 0.011940 0.011780
75% 0.019510 0.013342 0.015724 0.012704 0.012349 0.011974 0.012355
max 0.019628 0.013468 0.016104 0.012824 0.013435 0.012988 0.013215
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_learning_rate(train_X,train_y,validation_X,validation_y)
>1/5 param=0.100000, loss=0.011195
>2/5 param=0.100000, loss=0.011809
>3/5 param=0.100000, loss=0.018709
>4/5 param=0.100000, loss=0.032624
>5/5 param=0.100000, loss=0.015232
>1/5 param=0.001000, loss=0.011950
>2/5 param=0.001000, loss=0.012022
>3/5 param=0.001000, loss=0.012852
>4/5 param=0.001000, loss=0.012499
>5/5 param=0.001000, loss=0.011699
>1/5 param=0.000100, loss=0.033569
>2/5 param=0.000100, loss=0.027969
>3/5 param=0.000100, loss=0.051960
>4/5 param=0.000100, loss=0.043928
>5/5 param=0.000100, loss=0.035140
0.1 0.001 0.0001
count 5.000000 5.000000 5.000000
mean 0.017914 0.012204 0.038513
std 0.008756 0.000464 0.009449
min 0.011195 0.011699 0.027969
25% 0.011809 0.011950 0.033569
50% 0.015232 0.012022 0.035140
75% 0.018709 0.012499 0.043928
max 0.032624 0.012852 0.051960
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_weight_regularization(train_X,train_y,validation_X,validation_y)
>1/5 param=1.000000, loss=0.017913
>2/5 param=1.000000, loss=0.017820
>3/5 param=1.000000, loss=0.017644
>4/5 param=1.000000, loss=0.018630
>5/5 param=1.000000, loss=0.018884
>1/5 param=2.000000, loss=0.033898
>2/5 param=2.000000, loss=0.035481
>3/5 param=2.000000, loss=0.036565
>4/5 param=2.000000, loss=0.036193
>5/5 param=2.000000, loss=0.035109
>1/5 param=3.000000, loss=0.012555
>2/5 param=3.000000, loss=0.011829
>3/5 param=3.000000, loss=0.012490
>4/5 param=3.000000, loss=0.011938
>5/5 param=3.000000, loss=0.011889
>1/5 param=4.000000, loss=0.037587
>2/5 param=4.000000, loss=0.039391
>3/5 param=4.000000, loss=0.038415
>4/5 param=4.000000, loss=0.039397
>5/5 param=4.000000, loss=0.038773
1 2 3 4
count 5.000000 5.000000 5.000000 5.000000
mean 0.018178 0.035449 0.012140 0.038713
std 0.000544 0.001040 0.000352 0.000756
min 0.017644 0.033898 0.011829 0.037587
25% 0.017820 0.035109 0.011889 0.038415
50% 0.017913 0.035481 0.011938 0.038773
75% 0.018630 0.036193 0.012490 0.039391
max 0.018884 0.036565 0.012555 0.039397
%load_ext autoreload
%autoreload 2
import models
model,history=models.improved_lstm_model(train_X,train_y,validation_X,validation_y)
Train on 550 samples, validate on 53 samples Epoch 1/500 - 21s - loss: 0.7108 - val_loss: 0.5045 [... epochs 2-337 omitted: loss falls steadily from 0.6787 to 0.0269 and val_loss from 0.4682 to 0.0136 ...] Epoch 338/500 - 0s - loss: 0.0269 - val_loss: 0.0136 Epoch 339/500 - 0s - 
loss: 0.0268 - val_loss: 0.0134 Epoch 340/500 - 0s - loss: 0.0268 - val_loss: 0.0132 Epoch 341/500 - 0s - loss: 0.0268 - val_loss: 0.0132 Epoch 342/500 - 0s - loss: 0.0268 - val_loss: 0.0133 Epoch 343/500 - 0s - loss: 0.0267 - val_loss: 0.0133 Epoch 344/500 - 0s - loss: 0.0267 - val_loss: 0.0132 Epoch 345/500 - 0s - loss: 0.0267 - val_loss: 0.0133 Epoch 346/500 - 0s - loss: 0.0267 - val_loss: 0.0130 Epoch 347/500 - 0s - loss: 0.0266 - val_loss: 0.0132 Epoch 348/500 - 0s - loss: 0.0266 - val_loss: 0.0131 Epoch 349/500 - 0s - loss: 0.0266 - val_loss: 0.0132 Epoch 350/500 - 0s - loss: 0.0266 - val_loss: 0.0132 Epoch 351/500 - 0s - loss: 0.0266 - val_loss: 0.0132 Epoch 352/500 - 0s - loss: 0.0265 - val_loss: 0.0131 Epoch 353/500 - 0s - loss: 0.0265 - val_loss: 0.0130 Epoch 354/500 - 0s - loss: 0.0265 - val_loss: 0.0131 Epoch 355/500 - 0s - loss: 0.0265 - val_loss: 0.0131 Epoch 356/500 - 0s - loss: 0.0265 - val_loss: 0.0130 Epoch 357/500 - 0s - loss: 0.0265 - val_loss: 0.0129 Epoch 358/500 - 0s - loss: 0.0264 - val_loss: 0.0130 Epoch 359/500 - 0s - loss: 0.0264 - val_loss: 0.0129 Epoch 360/500 - 0s - loss: 0.0264 - val_loss: 0.0129 Epoch 361/500 - 0s - loss: 0.0264 - val_loss: 0.0128 Epoch 362/500 - 0s - loss: 0.0264 - val_loss: 0.0128 Epoch 363/500 - 0s - loss: 0.0264 - val_loss: 0.0127 Epoch 364/500 - 0s - loss: 0.0264 - val_loss: 0.0129 Epoch 365/500 - 0s - loss: 0.0263 - val_loss: 0.0127 Epoch 366/500 - 0s - loss: 0.0263 - val_loss: 0.0128 Epoch 367/500 - 0s - loss: 0.0263 - val_loss: 0.0128 Epoch 368/500 - 0s - loss: 0.0263 - val_loss: 0.0128 Epoch 369/500 - 0s - loss: 0.0263 - val_loss: 0.0128 Epoch 370/500 - 0s - loss: 0.0263 - val_loss: 0.0127 Epoch 371/500 - 0s - loss: 0.0263 - val_loss: 0.0129 Epoch 372/500 - 0s - loss: 0.0263 - val_loss: 0.0126 Epoch 373/500 - 0s - loss: 0.0262 - val_loss: 0.0129 Epoch 374/500 - 0s - loss: 0.0262 - val_loss: 0.0126 Epoch 375/500 - 0s - loss: 0.0262 - val_loss: 0.0128 Epoch 376/500 - 0s - loss: 0.0262 - val_loss: 0.0127 Epoch 
377/500 - 0s - loss: 0.0262 - val_loss: 0.0127 Epoch 378/500 - 0s - loss: 0.0262 - val_loss: 0.0127 Epoch 379/500 - 0s - loss: 0.0262 - val_loss: 0.0126 Epoch 380/500 - 0s - loss: 0.0262 - val_loss: 0.0126 Epoch 381/500 - 0s - loss: 0.0262 - val_loss: 0.0125 Epoch 382/500 - 0s - loss: 0.0261 - val_loss: 0.0126 Epoch 383/500 - 0s - loss: 0.0261 - val_loss: 0.0124 Epoch 384/500 - 0s - loss: 0.0261 - val_loss: 0.0127 Epoch 385/500 - 0s - loss: 0.0261 - val_loss: 0.0126 Epoch 386/500 - 0s - loss: 0.0261 - val_loss: 0.0126 Epoch 387/500 - 0s - loss: 0.0261 - val_loss: 0.0126 Epoch 388/500 - 0s - loss: 0.0261 - val_loss: 0.0126 Epoch 389/500 - 0s - loss: 0.0261 - val_loss: 0.0124 Epoch 390/500 - 0s - loss: 0.0261 - val_loss: 0.0125 Epoch 391/500 - 0s - loss: 0.0261 - val_loss: 0.0124 Epoch 392/500 - 0s - loss: 0.0260 - val_loss: 0.0125 Epoch 393/500 - 0s - loss: 0.0260 - val_loss: 0.0124 Epoch 394/500 - 0s - loss: 0.0260 - val_loss: 0.0125 Epoch 395/500 - 0s - loss: 0.0260 - val_loss: 0.0126 Epoch 396/500 - 0s - loss: 0.0260 - val_loss: 0.0125 Epoch 397/500 - 0s - loss: 0.0260 - val_loss: 0.0124 Epoch 398/500 - 0s - loss: 0.0260 - val_loss: 0.0125 Epoch 399/500 - 0s - loss: 0.0260 - val_loss: 0.0123 Epoch 400/500 - 0s - loss: 0.0260 - val_loss: 0.0126 Epoch 401/500 - 0s - loss: 0.0260 - val_loss: 0.0124 Epoch 402/500 - 0s - loss: 0.0260 - val_loss: 0.0126 Epoch 403/500 - 0s - loss: 0.0260 - val_loss: 0.0125 Epoch 404/500 - 0s - loss: 0.0260 - val_loss: 0.0124 Epoch 405/500 - 0s - loss: 0.0260 - val_loss: 0.0124 Epoch 406/500 - 0s - loss: 0.0260 - val_loss: 0.0123 Epoch 407/500 - 0s - loss: 0.0259 - val_loss: 0.0125 Epoch 408/500 - 0s - loss: 0.0259 - val_loss: 0.0123 Epoch 409/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 410/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 411/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 412/500 - 0s - loss: 0.0259 - val_loss: 0.0123 Epoch 413/500 - 0s - loss: 0.0259 - val_loss: 0.0125 Epoch 414/500 - 0s - loss: 0.0259 - 
val_loss: 0.0124 Epoch 415/500 - 0s - loss: 0.0259 - val_loss: 0.0125 Epoch 416/500 - 0s - loss: 0.0259 - val_loss: 0.0123 Epoch 417/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 418/500 - 0s - loss: 0.0259 - val_loss: 0.0123 Epoch 419/500 - 0s - loss: 0.0259 - val_loss: 0.0121 Epoch 420/500 - 0s - loss: 0.0259 - val_loss: 0.0124 Epoch 421/500 - 0s - loss: 0.0259 - val_loss: 0.0121 Epoch 422/500 - 0s - loss: 0.0258 - val_loss: 0.0124 Epoch 423/500 - 0s - loss: 0.0259 - val_loss: 0.0122 Epoch 424/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 425/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 426/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 427/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 428/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 429/500 - 0s - loss: 0.0259 - val_loss: 0.0121 Epoch 430/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 431/500 - 0s - loss: 0.0258 - val_loss: 0.0119 Epoch 432/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 433/500 - 0s - loss: 0.0258 - val_loss: 0.0120 Epoch 434/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 435/500 - 0s - loss: 0.0258 - val_loss: 0.0121 Epoch 436/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 437/500 - 0s - loss: 0.0258 - val_loss: 0.0121 Epoch 438/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 439/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 440/500 - 0s - loss: 0.0258 - val_loss: 0.0122 Epoch 441/500 - 0s - loss: 0.0258 - val_loss: 0.0123 Epoch 442/500 - 0s - loss: 0.0258 - val_loss: 0.0121 Epoch 443/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 444/500 - 0s - loss: 0.0258 - val_loss: 0.0120 Epoch 445/500 - 0s - loss: 0.0257 - val_loss: 0.0123 Epoch 446/500 - 0s - loss: 0.0258 - val_loss: 0.0119 Epoch 447/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 448/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 449/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 450/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 451/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 452/500 - 0s - 
loss: 0.0257 - val_loss: 0.0122 Epoch 453/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 454/500 - 0s - loss: 0.0258 - val_loss: 0.0121 Epoch 455/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 456/500 - 0s - loss: 0.0258 - val_loss: 0.0119 Epoch 457/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 458/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 459/500 - 0s - loss: 0.0257 - val_loss: 0.0122 Epoch 460/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 461/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 462/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 463/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 464/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 465/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 466/500 - 0s - loss: 0.0258 - val_loss: 0.0119 Epoch 467/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 468/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 469/500 - 0s - loss: 0.0257 - val_loss: 0.0121 Epoch 470/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 471/500 - 0s - loss: 0.0256 - val_loss: 0.0121 Epoch 472/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 473/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 474/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 475/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 476/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 477/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 478/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 479/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 480/500 - 0s - loss: 0.0257 - val_loss: 0.0119 Epoch 481/500 - 0s - loss: 0.0256 - val_loss: 0.0121 Epoch 482/500 - 0s - loss: 0.0257 - val_loss: 0.0118 Epoch 483/500 - 0s - loss: 0.0256 - val_loss: 0.0121 Epoch 484/500 - 0s - loss: 0.0256 - val_loss: 0.0118 Epoch 485/500 - 0s - loss: 0.0256 - val_loss: 0.0119 Epoch 486/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 487/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 488/500 - 0s - loss: 0.0256 - val_loss: 0.0121 Epoch 489/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 
490/500 - 0s - loss: 0.0257 - val_loss: 0.0120 Epoch 491/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 492/500 - 0s - loss: 0.0257 - val_loss: 0.0118 Epoch 493/500 - 0s - loss: 0.0256 - val_loss: 0.0120 Epoch 494/500 - 0s - loss: 0.0257 - val_loss: 0.0117 Epoch 495/500 - 0s - loss: 0.0256 - val_loss: 0.0121 Epoch 496/500 - 0s - loss: 0.0257 - val_loss: 0.0117 Epoch 497/500 - 0s - loss: 0.0256 - val_loss: 0.0122 Epoch 498/500 - 0s - loss: 0.0256 - val_loss: 0.0117 Epoch 499/500 - 0s - loss: 0.0255 - val_loss: 0.0120 Epoch 500/500 - 0s - loss: 0.0256 - val_loss: 0.0117
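The log shows val_loss flattening out near 0.012 long before epoch 500, so most of the later epochs add little. Keras offers a `keras.callbacks.EarlyStopping` callback for this situation; the stopping rule it applies can be sketched in plain Python (the function name and thresholds below are illustrative, not part of this project's code):

```python
def early_stop_epoch(val_losses, patience=20, min_delta=1e-4):
    """Return the 0-based epoch at which patience-based early stopping
    would halt training, given a per-epoch validation-loss history."""
    best = float('inf')
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            # validation loss improved meaningfully; reset the patience window
            best = loss
            best_epoch = epoch
        elif epoch - best_epoch >= patience:
            # no meaningful improvement for `patience` epochs: stop here
            return epoch
    return len(val_losses) - 1  # never triggered; ran all epochs
```

With the real callback this would be `model.fit(..., callbacks=[EarlyStopping(monitor='val_loss', patience=20)])`, which also avoids logging hundreds of near-identical epoch lines.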
pyplot.plot(history['loss'], label='train')
pyplot.plot(history['val_loss'], label='validation')
pyplot.xlabel('epoch')
pyplot.ylabel('loss')
pyplot.legend()
pyplot.show()
# make a prediction on the validation set
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse = models.make_lstm_prediction(validation_X, validation_y, model, scaler)
print('LSTM Model on Validation Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload
LSTM Model on Validation Data RMSE: 8.984
# make a prediction on the test set
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse = models.make_lstm_prediction(test_X, test_y, model, scaler)
print('LSTM Model on Test Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload
LSTM Model on Test Data RMSE: 8.709